A refinement of Bennett's inequality with applications to portfolio optimization
A refinement of Bennett's inequality is introduced which is strictly tighter
than the classical bound. The new bound establishes the convergence of the
average of independent random variables to its expected value. It also
carefully exploits information about the potentially heterogeneous mean,
variance, and ceiling of each random variable. The bound is strictly sharper in
the homogeneous setting and very often significantly sharper in the
heterogeneous setting. The improved convergence rates are obtained by leveraging the Lambert W function. We apply the new bound in a portfolio optimization setting to allocate a budget across investments with heterogeneous returns.
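The refined bound itself is not reproduced in this listing. As a point of reference, the sketch below shows the classical Bennett bound that it tightens, together with the standard Lambert W inversion that turns the tail bound into a deviation bound at a given confidence level (via `scipy.special.lambertw`). The function names and parameters are illustrative assumptions, not the paper's notation.

```python
# A minimal sketch of the *classical* Bennett bound (not the paper's refinement),
# and its inversion via the Lambert W function. Names below are illustrative.
import numpy as np
from scipy.special import lambertw

def bennett_tail(t, var_sum, ceiling):
    """Classical Bennett upper bound on P(sum of centered X_i >= t),
    where each X_i <= ceiling a.s. and var_sum = sum of Var(X_i)."""
    h = lambda u: (1.0 + u) * np.log1p(u) - u
    return np.exp(-(var_sum / ceiling**2) * h(ceiling * t / var_sum))

def h_inverse(c):
    """Solve (1+u)*log(1+u) - u = c for u >= 0 using the principal Lambert W branch."""
    return np.exp(lambertw((c - 1.0) / np.e).real + 1.0) - 1.0

def bennett_deviation(delta, var_sum, ceiling):
    """Smallest t such that the classical Bennett bound gives P(sum >= t) <= delta."""
    c = (ceiling**2 / var_sum) * np.log(1.0 / delta)
    return (var_sum / ceiling) * h_inverse(c)

# Example: 1000 centered variables, each with variance 0.25 and ceiling 1.
t = bennett_deviation(delta=0.05, var_sum=1000 * 0.25, ceiling=1.0)
print(t, bennett_tail(t, var_sum=250.0, ceiling=1.0))  # tail evaluates to ~0.05
```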
Frank-Wolfe Algorithms for Saddle Point Problems
We extend the Frank-Wolfe (FW) optimization algorithm to solve constrained
smooth convex-concave saddle point (SP) problems. Remarkably, the method only
requires access to linear minimization oracles. Leveraging recent advances in
FW optimization, we provide the first proof of convergence of a FW-type saddle
point solver over polytopes, thereby partially answering a 30-year-old
conjecture. We also survey other convergence results and highlight gaps in the
theoretical underpinnings of FW-style algorithms. Motivating applications
without known efficient alternatives are explored through structured prediction
with combinatorial penalties as well as games over matching polytopes involving
an exponential number of constraints.

Comment: Appears in: Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS 2017). 39 pages.
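The saddle-point extension is not reproduced here. As background, the sketch below shows the classical Frank-Wolfe iteration the paper builds on, where the feasible set is accessed only through a linear minimization oracle (here the probability simplex, whose oracle returns a vertex). The objective, domain, and step-size rule are illustrative assumptions.

```python
# A minimal sketch of the classical Frank-Wolfe iteration with LMO access
# (the saddle-point variant from the paper is not reproduced here).
import numpy as np

def simplex_lmo(gradient):
    """Linear minimization oracle over the probability simplex:
    argmin_{s in simplex} <gradient, s> is a standard basis vector."""
    s = np.zeros_like(gradient)
    s[np.argmin(gradient)] = 1.0
    return s

def frank_wolfe(grad_f, x0, n_iters=200):
    """Frank-Wolfe with the standard 2/(k+2) step size; only LMO access is needed."""
    x = x0.copy()
    for k in range(n_iters):
        s = simplex_lmo(grad_f(x))       # call the linear minimization oracle
        gamma = 2.0 / (k + 2.0)          # classical diminishing step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Example: minimize ||Ax - b||^2 over the simplex (a toy smooth convex objective).
rng = np.random.default_rng(0)
A, b = rng.normal(size=(30, 10)), rng.normal(size=30)
grad_f = lambda x: 2.0 * A.T @ (A @ x - b)
x_star = frank_wolfe(grad_f, x0=np.ones(10) / 10)
print(x_star.sum(), x_star.min())  # iterate stays in the simplex: sums to 1, nonnegative
```

Because each iterate is a convex combination of the previous iterate and a polytope vertex returned by the oracle, feasibility is maintained without any projection step.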
Square Root Propagation
We propose a message propagation scheme for numerically stable inference in Gaussian graphical models, which can otherwise be susceptible to errors caused by finite numerical precision. We adapt square-root algorithms, popular in Kalman filtering, to graphs with arbitrary topologies. The method consists of maintaining potentials and generating messages that involve the square roots of precision matrices. Combining this with the machinery of the junction tree algorithm leads to an efficient and numerically stable algorithm. Experiments are presented to demonstrate the robustness of the method to numerical errors that can arise in complex learning and inference problems.
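As background for the square-root representation, the sketch below combines two Gaussian potentials by stacking the square roots of their precision matrices and taking a QR decomposition, so the summed precision matrix is never formed explicitly. This is the standard square-root trick from filtering rather than the paper's full message-passing algorithm, and all names are assumptions.

```python
# A minimal sketch of combining Gaussian potentials in square-root form:
# work with triangular factors of the precisions instead of the precisions themselves.
import numpy as np

def combine_sqrt_precision(R1, R2):
    """Given upper-triangular factors with R1.T @ R1 = P1 and R2.T @ R2 = P2,
    return an upper-triangular R with R.T @ R = P1 + P2, without forming P1 + P2."""
    stacked = np.vstack([R1, R2])
    _, R = np.linalg.qr(stacked)  # QR keeps the computation in square-root form
    return R

# Example: combine two random SPD precisions and check against direct addition.
rng = np.random.default_rng(1)
A1, A2 = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
P1, P2 = A1 @ A1.T + np.eye(4), A2 @ A2.T + np.eye(4)
R1, R2 = np.linalg.cholesky(P1).T, np.linalg.cholesky(P2).T  # upper-triangular roots
R = combine_sqrt_precision(R1, R2)
print(np.allclose(R.T @ R, P1 + P2))  # True: the square-root form matches the sum
```

Working with factors in this way preserves symmetry and positive definiteness under rounding, which is the source of the numerical robustness the abstract refers to.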